model component self-attention mechanism
FedMCSA: Personalized Federated Learning via Model Components Self-Attention
Qi Guo, Yong Qi, Saiyu Qi, Di Wu, Qian Li
Standard FL follows three steps: (i) at each iteration, the server distributes the global model to the clients; (ii) each client trains its local model on its private data, starting from the global model; (iii) the server aggregates the local models updated by the clients into a new global model; the process repeats until convergence [1, 4]. FL enables effective collaboration between clients when the data distributions are independent and identically distributed (IID), i.e., the clients' private data distributions are similar to one another. However, in many application scenarios, clients' private data may differ in size and class distribution, that is, the data distributions are not independent and identically distributed (Non-IID). In this case, FL may fail to achieve effective collaboration across clients because of the differences in their private data [5]. Various algorithms have been proposed to handle Non-IID data in FL, and they can be divided into two categories: average aggregation methods and model-based aggregation methods. As shown in Figure 1(a), average aggregation methods average all local models to generate a global model and distribute it to all clients, where an additional fine-tuning step is performed on each client to train a personalized model [6, 7, 8, 9].
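The three-step loop above can be sketched in a few lines. This is a minimal illustrative implementation, not the paper's method: it assumes a simple least-squares local objective and FedAvg-style weighted averaging at the server; all function names and hyperparameters are our own.

```python
import numpy as np

def client_update(global_model, data, lr=0.1, epochs=1):
    # Step (ii): local training on the client's private data,
    # starting from the global model (least-squares loss, illustrative).
    w = global_model.copy()
    X, y = data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def server_aggregate(local_models, sizes):
    # Step (iii): aggregate local models into a new global model,
    # weighting each client by its dataset size (FedAvg-style average).
    weights = np.asarray(sizes, dtype=float) / sum(sizes)
    return sum(w * m for w, m in zip(weights, local_models))

def federated_round(global_model, client_data):
    # Step (i): the server "distributes" the global model by passing it
    # to each client; then local training and aggregation follow.
    local_models = [client_update(global_model, d) for d in client_data]
    sizes = [len(d[1]) for d in client_data]
    return server_aggregate(local_models, sizes)
```

With IID synthetic data across clients, repeating `federated_round` drives the global model toward the shared optimum; under Non-IID splits the same averaging step is exactly where the collaboration can break down, which motivates the personalized aggregation methods discussed next.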